# Python code generation

## OpenCodeReasoning-Nemotron-14B-GGUF
- License: Apache-2.0
- OpenCodeReasoning-Nemotron-14B is a large language model based on Qwen2.5-14B-Instruct with post-training, optimized for code generation and reasoning, with a 32K-token context length.
- Tags: Large Language Model, Supports Multiple Languages
- Author: Mungert · Downloads: 507 · Likes: 1
## OpenCodeReasoning-Nemotron-32B-GGUF
- License: Apache-2.0
- OpenCodeReasoning-Nemotron-32B is a code-generation reasoning model based on Qwen2.5-32B-Instruct with a 32K-token context length, suitable for both commercial and non-commercial use.
- Tags: Large Language Model, Supports Multiple Languages
- Author: Mungert · Downloads: 633 · Likes: 1
## PyCodeT5
- License: Apache-2.0
- PyCodeT5 is a specialized variant of CodeT5, fine-tuned for generating and understanding Python functions; it converts natural-language descriptions into working Python code.
- Tags: Large Language Model, Transformers, Supports Multiple Languages
- Author: S-Dreamer · Downloads: 31 · Likes: 1
## Phi-3 Mini 4K Python
- License: Apache-2.0
- A Python code-generation model fine-tuned from unsloth/Phi-3-mini-4k-instruct-bnb-4bit, trained with the Unsloth and TRL libraries for roughly 2x faster training.
- Tags: Large Language Model, English
- Author: theprint · Downloads: 175 · Likes: 1
## CodeLlama-13b-Python-hf
- Code Llama is Meta's family of pretrained and fine-tuned generative text models ranging from 7 to 34 billion parameters. This is the 13-billion-parameter Python-specialized version.
- Tags: Large Language Model, Transformers, Other
- Author: meta-llama · Downloads: 636 · Likes: 7
## CodeLlama-70B-Python-GPTQ
- CodeLlama 70B Python is a large language model focused on the Python programming language, based on the Llama 2 architecture and optimized for code generation and completion tasks.
- Tags: Large Language Model, Transformers, Other
- Author: TheBloke · Downloads: 89 · Likes: 19
## TinyCode Python
- License: MIT
- This model was trained on 4 of the 58 Python files in the bigcode/starcoderdata dataset, primarily for code-related tasks.
- Tags: Large Language Model, Transformers, Supports Multiple Languages
- Author: blueapple8259 · Downloads: 22 · Likes: 1
## TinyMistral V2 Pycoder Instruct 248M
- License: Apache-2.0
- A Python-specific code-generation model built on TinyMistral-248M-v2-Instruct, supporting instruction-formatted Python code generation.
- Tags: Large Language Model, Transformers, Other
- Author: jtatman · Downloads: 16 · Likes: 3
## TinyLlama Python GGUF
- License: Apache-2.0
- Quantized GGUF files of rahuldshetty/tinyllama-python, which was fine-tuned from unsloth/tinyllama-bnb-4bit and specializes in Python code generation.
- Tags: Large Language Model, Supports Multiple Languages
- Author: rahuldshetty · Downloads: 32 · Likes: 1
## Phi-1
- License: MIT
- Phi-1 is a 1.3-billion-parameter Transformer model designed for basic Python programming, achieving over 50% accuracy on the HumanEval benchmark.
- Tags: Large Language Model, Transformers, Supports Multiple Languages
- Author: microsoft · Downloads: 7,907 · Likes: 211
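HumanEval scores such as Phi-1's are measured by functional correctness: the model completes a function from its signature and docstring, and the completion counts as a pass only if it runs cleanly against the problem's unit tests. A minimal sketch of that check is below; the toy problem and the `run_candidate` helper are illustrative, not part of the official harness (which additionally sandboxes execution and enforces timeouts):

```python
# Minimal sketch of HumanEval-style functional-correctness checking.
# A candidate completion passes if prompt + completion executes and the
# problem's unit tests run without raising. (Illustrative only.)

def run_candidate(prompt: str, completion: str, test_code: str) -> bool:
    """Execute prompt + completion, then run the unit tests against it."""
    namespace: dict = {}
    try:
        exec(prompt + completion, namespace)   # define the function
        exec(test_code, namespace)             # run the unit tests
        return True
    except Exception:
        return False

# A toy problem in HumanEval format (hypothetical, not from the real set).
prompt = 'def add(a, b):\n    """Return the sum of a and b."""\n'
good = "    return a + b\n"
bad = "    return a - b\n"
tests = "assert add(2, 3) == 5\nassert add(-1, 1) == 0\n"

print(run_candidate(prompt, good, tests))  # True
print(run_candidate(prompt, bad, tests))   # False
```

A completion that raises, fails an assertion, or does not even parse is simply scored as a failure, which is why these benchmarks reward runnable code rather than plausible-looking text.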
## Llama 2 7B Int4 GPTQ Python Code 20k
- License: GPL-3.0
- A 4-bit GPTQ-quantized version of the Llama 2 7B model, fine-tuned for Python code-generation tasks.
- Tags: Large Language Model, Transformers, Other
- Author: edumunozsala · Downloads: 22 · Likes: 1
## EvolCodeLlama-7b
- License: Apache-2.0
- A code-generation model fine-tuned from CodeLlama-7b-hf on the Evol-Instruct-Python-1k dataset using QLoRA (4-bit precision).
- Tags: Large Language Model, Transformers
- Author: mlabonne · Downloads: 34 · Likes: 6
## CodeLlama-7B-Python-GGUF
- CodeLlama 7B Python is Meta's 7-billion-parameter large language model focused on Python code generation, provided here as quantized GGUF files.
- Tags: Large Language Model, Transformers
- Author: TheBloke · Downloads: 2,385 · Likes: 57
## CodeLlama-34b-Python-hf
- Code Llama's 34-billion-parameter Python-specific code-generation model, developed by Meta on the Llama 2 architecture and focused on Python code synthesis and understanding.
- Tags: Large Language Model, Transformers, Other
- Author: codellama · Downloads: 2,135 · Likes: 96
## CodeT5p-220m-py
- License: BSD-3-Clause
- CodeT5+ is an open-source family of encoder-decoder code language models supporting a wide range of code understanding and generation tasks. This is the 220M-parameter version, specifically tuned for Python code generation.
- Tags: Large Language Model, Transformers
- Author: Salesforce · Downloads: 961 · Likes: 15
## CodeT5p-770m-py
- License: BSD-3-Clause
- CodeT5+ is an open-source family of encoder-decoder code language models supporting a wide range of code understanding and generation tasks. This 770M-parameter version has been additionally fine-tuned on Python code.
- Tags: Large Language Model, Transformers
- Author: Salesforce · Downloads: 4,564 · Likes: 20
## Tiny StarCoder Py
- License: OpenRAIL
- A 164-million-parameter model based on the StarCoder architecture, optimized for Python code generation.
- Tags: Large Language Model, Transformers
- Author: bigcode · Downloads: 1,886 · Likes: 74
## CodeT5 Small Custom Functions Dataset Python
- License: Apache-2.0
- A Python code-generation model fine-tuned from Salesforce/codet5-small, specializing in custom function generation.
- Tags: Large Language Model, Transformers
- Author: sharoz · Downloads: 43 · Likes: 1
## CodeGen 350M Mono Custom Functions Dataset Python V2
- License: BSD-3-Clause
- A Python code-generation model fine-tuned from Salesforce/codegen-350M-mono, focusing on custom function generation.
- Tags: Large Language Model, Transformers
- Author: sharoz · Downloads: 130 · Likes: 2
## CodeT5-large-ntp-py
- License: BSD-3-Clause
- CodeT5 is a large encoder-decoder model pretrained with a next-token-prediction (NTP) objective on Python, focused on code understanding and generation tasks.
- Tags: Large Language Model, Transformers
- Author: Salesforce · Downloads: 217 · Likes: 27
## CodeGen-16B-mono
- License: BSD-3-Clause
- CodeGen-Mono 16B is an autoregressive language model for program synthesis, specializing in generating executable code from English prompts.
- Tags: Large Language Model, Transformers
- Author: Salesforce · Downloads: 227 · Likes: 126
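Program-synthesis models of this kind are typically scored with the pass@k metric: the probability that at least one of k sampled completions passes a problem's unit tests. The standard unbiased estimator draws n samples, counts the c correct ones, and computes 1 − C(n−c, k)/C(n, k). A small sketch, with illustrative sample counts:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: n samples drawn, c of them correct.

    Computes 1 - C(n-c, k) / C(n, k), the probability that a random
    size-k subset of the n samples contains at least one correct one.
    """
    if n - c < k:
        return 1.0  # every size-k subset must include a correct sample
    return 1.0 - comb(n - c, k) / comb(n, k)

# Illustrative numbers: 20 samples per problem, 4 of them correct.
print(round(pass_at_k(20, 4, 1), 3))  # 0.2
```

Averaging this quantity over all benchmark problems gives the reported pass@k score; pass@1 reduces to the plain fraction of correct samples.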
## CodeGen-6B-mono
- License: BSD-3-Clause
- CodeGen is a family of autoregressive language models for program synthesis; CodeGen-Mono 6B is the 6B-parameter model further pretrained on Python code.
- Tags: Large Language Model, Transformers
- Author: Salesforce · Downloads: 600 · Likes: 38
## Python GPT-2 Large Issues 128
- License: Apache-2.0
- A GPT-2 large model fine-tuned from bert-base-uncased, specializing in Python code-related issues.
- Tags: Large Language Model, Transformers
- Author: aytugkaya · Downloads: 15 · Likes: 0
## GPT-Neo 125M APPS All
- License: MIT
- A programming-task-solving model fine-tuned from GPT-Neo-125M on the APPS dataset.
- Tags: Large Language Model
- Author: flax-community · Downloads: 17 · Likes: 2
## CodeParrot
- CodeParrot is a 1.5-billion-parameter model based on the GPT-2 architecture, focused on automatic Python code generation.
- Tags: Large Language Model, Transformers, Other
- Author: codeparrot · Downloads: 1,342 · Likes: 105
## Genji Python 6B Split
- License: Apache-2.0
- A GPT-J 6B model fine-tuned for Python code generation and programming assistance.
- Tags: Large Language Model, Transformers, English
- Author: baffo32 · Downloads: 16 · Likes: 0
## GPyT
- License: MIT
- GPyT is a GPT-2 model trained from scratch on 80 GB of pure Python code, focused on Python code generation.
- Tags: Large Language Model, Transformers, Other
- Author: Sentdex · Downloads: 21 · Likes: 22
## NLGP Docstring
- License: Apache-2.0
- A Python code-generation model trained on Jupyter notebooks; it synthesizes code from natural-language intent within the surrounding code context.
- Tags: Large Language Model, Transformers, Supports Multiple Languages
- Author: Nokia · Downloads: 45 · Likes: 2
© 2025 AIbase